	article();
	// By Strauss

==========================
Programming Paradigms
==========================
* Note: This is best viewed with Word Wrap turned off and the   *
* NotePad window maximized; some scrolling required.            *
{
	Alright, so you got a computer, you got access to the amazing world of the internet,
and now you wanna be a hacker (I'll spare you the sermon on "what a hacker is"...that should 
be a whole new article). But you wanna be 31337! You want to...p-p-p...program! Another 
situation: you got nothing to do with the hacker world, but you wanna be able to give the 
computer specific instructions for it to follow. Just because you find it cool, maybe. Or 
because you have this idea for a program that you couldn't find on the internet, or that you 
found, but that works terribly. You might also be someone who has this idea: "I want these
machines to give me some money...I'll write a program...I wanna be Bill Gates".
	No matter what your situation is, if you wanna start writing your own source code,
one thing you may already know is that programs can be written in different "programming
languages". If you didn't know that, then you probably developed this so-called computer 
programming interest in the last 30 seconds. Anyway, you google (actually I use altavista) for 
some info on this and, suddenly, you find yourself drowned in a sea of different languages, and
in a desperate cry for help you ask "what language should I learn and concentrate my efforts
on?".
	Ok, now there we go. If your research goes just a little further on this you'll
find that programming languages follow programming paradigms, which can be thought of as
theoretical frameworks that standardize ways of effectively programming software systems.
Thomas Kuhn once defined 'paradigm' as "essentially...a collection of beliefs shared by
scientists,...a set of agreements over how [the] problems [of a field] will be understood
and treated...", so apply that to the software programming field and you got it. A programming
paradigm is often associated with a family of programming languages. Most languages
support only one paradigm, but some support multiple.
	So, ever since programs actually started being written, computer scientists have come up
with lots of different programming paradigms. Some unpopular and weird, and some you might
have heard of. Next thing you know you got the Functional Paradigm, Logical Paradigm, Object-Oriented
Paradigm, Database Paradigm, Distributed Paradigm, Multi-Paradigm "Paradigm", Structured
Paradigm, Aspect-Oriented Paradigm...just to name a few. And, of course, with that comes
another load of programming languages: C, C++, C#, Java, Perl, Visual Basic, COBOL, FORTRAN,
Python, Prolog, Delphi...remember that sea of languages? (Note: some of these, like Perl and
Python, are often called scripting languages rather than programming languages)
	My opinion on this is that you should get to know the pros and cons of a language and
decide which one better suits your needs. C is a relatively old language that is still widely
used worldwide, COBOL is a quite old language that still has its (few, but faithful) applications,
and Java and object-oriented programming are slowly conquering the marketplace. I guess
people should have at least the slightest idea of the usage, complexity, and syntax of a
language so they can decide wisely which one to use when facing a problem. If you wanna be a
professional in the field, then you should also be aware of new technologies and analyze how
they can affect your job, since a new paradigm can leave you completely out of date. Besides,
the more languages/paradigms you learn, the easier picking up the next one becomes.
	In my specific case, I'm in love with object-oriented programming and currently use
the Java platform, though I intend to work a little with Microsoft .NET soon. The concept of
these platforms amazes me. Now you don't write your programs directly for the machine, for
the hardware; instead you write your code against a portable framework that can, at least
theoretically, move/adapt to almost any kind of hardware, running on a runtime environment
("write once, run anywhere"). Of course, there are some issues: for starters, the user of the 
compiled code must have the runtime environment installed on his machine, which can be a pain
in the ass. But that's kinda obvious and inevitable, since this portability is achieved because the
program runs on the Java Virtual Machine (for example), which in turn runs on a physical computer. Among
the rival platforms you can find pros and cons: .NET supports a variety of languages,
Java has its portability in a more mature state, .NET supports web services, and so on. Java is
an evolving technology, many software shops worldwide are migrating to the object-oriented
paradigm, and the abstraction of stuff like memory allocation is just great, sometimes. But I'm
having trouble writing network applications, because Java is still a limited language and, in
this case, C gives me much more freedom and a wider range of possibilities, though the writing
itself gets more complex.
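	For the curious, here's roughly what talking TCP in plain Java looks like: a minimal
sketch using the standard java.net classes, with a server and a client chatting over the
loopback interface inside a single program. The class and method names are my own invention,
and a real network application would obviously need error handling, threads, and so on.

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import java.net.ServerSocket;
import java.net.Socket;

public class LoopbackDemo {

    // Opens a listening socket, connects a client to it, sends one line
    // from server to client, and returns what the client received.
    public static String echoOnce() throws Exception {
        // Port 0 asks the OS to pick any free port for us.
        try (ServerSocket server = new ServerSocket(0);
             Socket client = new Socket("127.0.0.1", server.getLocalPort());
             Socket accepted = server.accept();
             PrintWriter out = new PrintWriter(accepted.getOutputStream(), true);
             BufferedReader in = new BufferedReader(
                     new InputStreamReader(client.getInputStream()))) {
            out.println("hello, network");  // server side writes one line
            return in.readLine();           // client side reads it back
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(echoOnce());
    }
}
```

	Compare that with the raw BSD sockets dance in C (socket/bind/listen/accept and manual
buffer handling) and you see the trade-off I mean: Java hides a lot of the plumbing, C hands
you the plumbing and everything else.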
	Java won't be the last language. Java won't be the "only" language. Nor will C, or any
language at all. In short: know your territory, choose your favorite weapon, but, most important,
have your artillery set up for every possible kind of enemy because, like someone said, knowledge
is power!
}